13 research outputs found

    Decentralized energy efficient model for data transmission in IoT-based healthcare system

    Get PDF
    The growing world population faces challenges such as increasing chronic disease and rising medical expenses. Integrating modern technology into the healthcare system can diminish these issues. The internet of medical things (IoMT) is a vision for providing a better healthcare system. The IoMT comprises different sensor nodes connected together. The IoMT system incorporates medical devices (sensors) that provide healthcare facilities to the patient, and the physician gains the ability to monitor patients very efficiently. The main challenges for the IoMT are energy consumption, battery charge consumption, and limited battery lifetime in sensor-based medical devices. Charges stored in the battery during charging are not fully utilized due to the nonlinearity of the discharging process. The short time period needed to restore these unused charges is referred to as the recovery effect. An algorithm that exploits the recovery effect can extend the battery lifetime, which leads to lower energy consumption. This paper proposes an adaptive energy-efficient algorithm (EEA) that adopts this effect to enhance energy efficiency, battery lifetime, and throughput. The results have been simulated in MATLAB considering a Li-ion battery. The proposed EEA algorithm is also compared with an existing state-of-the-art method, BRLE. The proposed algorithm increases battery lifetime, reduces energy consumption, and provides improved performance compared to the BRLE algorithm. It consumes little energy and supports continuous connectivity of devices without loss or interruption.
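
    The abstract does not spell out the EEA algorithm itself; the following Python fragment is only a minimal sketch of the recovery effect it exploits, with all function names and parameters (burst_drain, recovery_rate, etc.) being illustrative assumptions rather than the authors' model:

```python
# Minimal sketch of the battery recovery effect exploited by EEA-style
# algorithms. All names and parameters are illustrative assumptions.

def simulate(bursts, burst_drain=5.0, idle_slots=2, recovery_rate=0.3,
             capacity=100.0):
    """Drain the battery in transmission bursts; each idle slot restores
    a fraction of the charge locked away by non-linear discharge."""
    charge = capacity
    unavailable = 0.0  # charge temporarily locked by non-linearity
    for _ in range(bursts):
        # A burst consumes usable charge and also locks some extra
        # charge due to the non-linear discharge behaviour.
        charge -= burst_drain
        unavailable += 0.2 * burst_drain
        # Idle slots trigger the recovery effect: part of the locked
        # charge becomes usable again.
        for _ in range(idle_slots):
            recovered = recovery_rate * unavailable
            unavailable -= recovered
            charge += recovered
        if charge <= 0:
            break
    return charge

print(simulate(bursts=10))                 # with recovery slots
print(simulate(bursts=10, idle_slots=0))   # no idle time: faster depletion
```

    Comparing the two calls shows the core trade-off the paper describes: inserting idle time leaves more usable charge after the same number of transmissions.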

    Recent Advances in Deep Learning Techniques for Face Recognition

    Full text link
    In recent years, researchers have proposed many deep learning (DL) methods for various tasks, and face recognition (FR) in particular has made an enormous leap using these techniques. Deep FR systems benefit from the hierarchical architecture of DL methods to learn discriminative face representations. DL techniques have therefore significantly improved state-of-the-art performance on FR systems and encouraged diverse and efficient real-world applications. In this paper, we present a comprehensive analysis of various FR systems that leverage different types of DL techniques, and for the study we summarize 168 recent contributions from this area. We discuss papers related to different algorithms, architectures, loss functions, activation functions, datasets, challenges, improvement ideas, and current and future trends of DL-based FR systems. We provide a detailed discussion of various DL methods to understand the current state of the art, and then we discuss the various activation and loss functions used by these methods. Additionally, we summarize datasets widely used for FR tasks and discuss challenges related to illumination, expression, pose variations, and occlusion. Finally, we discuss improvement ideas and current and future trends of FR tasks.
    Comment: 32 pages; citation: M. T. H. Fuad et al., "Recent Advances in Deep Learning Techniques for Face Recognition," in IEEE Access, vol. 9, pp. 99112-99142, 2021, doi: 10.1109/ACCESS.2021.309613
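
    Among the loss functions surveys like this cover, additive-angular-margin softmax losses (e.g., ArcFace) are a common example; the following NumPy sketch illustrates the idea, with the scale s and margin m as assumed values not taken from the paper:

```python
import numpy as np

def angular_margin_logits(embeddings, weights, labels, s=64.0, m=0.5):
    """ArcFace-style logits: cosine similarity between L2-normalised
    embeddings and class weights, with an additive angular margin m
    applied to each sample's ground-truth class, scaled by s."""
    e = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    w = weights / np.linalg.norm(weights, axis=0, keepdims=True)
    cos = e @ w                                    # (batch, classes)
    theta = np.arccos(np.clip(cos, -1.0, 1.0))
    theta[np.arange(len(labels)), labels] += m     # margin on true class
    return s * np.cos(theta)

# Ordinary softmax cross-entropy is then applied to these logits;
# the margin forces tighter, more discriminative face clusters.
```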

    A Blockchain-Based Trust Model for the Internet of Things Supply Chain Management

    No full text
    Accurate data and strategic business processes are crucial to all parties in a supply chain system. However, the absence of mutual trust can create a barrier to implementation. Several studies have shown that supply chains face challenges arising from a lack of trust with respect to the sharing of data. How well each party trusts the data it receives can have a profound influence on management decisions. Blockchain technology has been widely used to process cryptocurrency transactions. Recently, it has also proved effective in creating trust in the Internet of things (IoT) domain. Blockchain technology can facilitate mutual trust between parties who would otherwise have been doubtful of each other's data, allowing for more effective and secure sharing of data. However, if the blockchain is not IoT-optimized, companies can experience significant delays and the need for extensive computational capacity. Moreover, there are still some limitations regarding consensus between nodes in traditional consensus approaches. Here, we propose an alternative approach to creating trust in supply chains with diverse IoT elements. Our streamlined trust model simplifies data sharing and reduces computational, storage, and latency requirements while increasing the security of IoT-based supply chain management. We evaluate the suggested model using simulations and highlight its viability.
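
    The paper does not publish its protocol, but the trust-creating property of a blockchain it builds on can be illustrated with a minimal hash-chained ledger sketch in Python (record fields such as sku and temp_c are hypothetical):

```python
import hashlib, json, time

def block_hash(block):
    """Hash of the block body (everything except the stored hash)."""
    body = {k: block[k] for k in ("timestamp", "data", "prev")}
    return hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()).hexdigest()

def make_block(data, prev_hash):
    block = {"timestamp": time.time(), "data": data, "prev": prev_hash}
    block["hash"] = block_hash(block)
    return block

# Hypothetical supply-chain events recorded by IoT sensors.
chain = [make_block({"event": "genesis"}, "0" * 64)]
chain.append(make_block({"sku": "A1", "temp_c": 4.2, "sensor": "iot-7"},
                        chain[-1]["hash"]))

def verify(chain):
    """Tampering with any record invalidates every later link."""
    return (all(b["hash"] == block_hash(b) for b in chain) and
            all(b["prev"] == p["hash"] for p, b in zip(chain, chain[1:])))

print(verify(chain))  # True until any block is altered
```

    Because each block commits to its predecessor's hash, no party needs to trust another's database: altering any shared record is immediately detectable by everyone.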

    Lies Kill, Facts Save: Detecting COVID-19 Misinformation in Twitter

    No full text
    Online social networks (OSNs) such as Twitter have grown to be very useful tools for the dissemination of information. However, they have also become a fertile ground for the spread of false information, particularly regarding the ongoing coronavirus disease 2019 (COVID-19) pandemic. Best described as an infodemic, this situation creates a great need, now more than ever, for scientific fact-checking and misinformation detection regarding the dangers these tools pose with respect to COVID-19. In this article, we analyze the credibility of information shared on Twitter pertaining to the COVID-19 pandemic. For our analysis, we propose an ensemble-learning-based framework for verifying the credibility of a vast number of tweets. In particular, we carry out analyses of a large dataset of tweets conveying information regarding COVID-19. In our approach, we classify the information into two categories: credible or non-credible. Our classifications of tweet credibility are based on various features, including tweet- and user-level features. We conduct multiple experiments on the collected and labeled dataset. The results obtained with the proposed framework reveal high accuracy in detecting credible and non-credible tweets containing COVID-19 information.
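
    As a hedged illustration of the ensemble-learning idea (the paper's actual features and base learners may differ; the feature vectors below are hypothetical), a soft-voting ensemble in scikit-learn might look like this:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB

# Hypothetical tweet- and user-level features per row:
# [retweets, tweet_length, has_url, followers, account_age_days];
# label 1 = credible, 0 = non-credible.
X = np.array([[120, 140, 1, 5000, 2000],
              [  3,  40, 0,   12,   10],
              [ 45,  90, 1,  800,  600],
              [  1,  20, 0,    5,    3]])
y = np.array([1, 0, 1, 0])

ensemble = VotingClassifier(
    estimators=[("lr", LogisticRegression(max_iter=1000)),
                ("rf", RandomForestClassifier(n_estimators=100)),
                ("nb", GaussianNB())],
    voting="soft")  # average class probabilities across learners
ensemble.fit(X, y)
print(ensemble.predict([[10, 60, 0, 50, 30]]))
```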

    On the Security and Privacy Challenges of Virtual Assistants

    Get PDF
    Since the purchase of Siri by Apple, and its release with the iPhone 4S in 2011, virtual assistants (VAs) have grown in number and popularity. The sophisticated natural language processing and speech recognition employed by VAs enables users to interact with them conversationally, almost as they would with another human. To service user voice requests, VAs transmit large amounts of data to their vendors; these data are processed and stored in the Cloud. The potential data security and privacy issues involved in this process provided the motivation to examine the current state of the art in VA research. In this study, we identify peer-reviewed literature that focuses on security and privacy concerns surrounding these assistants, including current trends in addressing how voice assistants are vulnerable to malicious attacks and worries that the VA is recording without the user's knowledge or consent. The findings show that not only are these worries manifold, but there is a gap in the current state of the art, and no current literature reviews on the topic exist. This review sheds light on future research directions, such as providing solutions to perform voice authentication without an external device, and the compliance of VAs with privacy regulations.

    Deep Learning-based Smart IoT Health System for Blindness Detection using Retina Images

    No full text
    Deep learning-based smart healthcare is attracting much attention due to its real-time applicability in everyday life, and it has gained further attention with the convergence of IoT. Diabetic eye disease is the primary cause of blindness among working-aged people. Heavily populated Asian countries such as India and China presently account for millions of diabetic patients and are on the verge of an eruption in their diabetic populations. This growing number of diabetic patients poses a major challenge for trained doctors providing medical screening and diagnosis. Our goal is to leverage deep learning techniques to automate the detection of blind spots in the eye and identify how severe the stage may be. In this paper, we propose an optimized technique built on top of the recently released pre-trained EfficientNet models for blindness identification in retinal images, along with a comparative analysis against various other neural network models. Our fine-tuned EfficientNet-B5 based model is evaluated on a benchmark dataset of retina images captured using fundus photography at varied imaging stages, and it outperforms CNN and ResNet50 models.
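
    A minimal sketch of such a fine-tuning setup with tf.keras (the five-class severity grading, input size, and hyper-parameters are assumptions; the paper's own pipeline may differ):

```python
import tensorflow as tf

# Assumed 5 severity grades, as in common diabetic-retinopathy datasets.
NUM_CLASSES, IMG_SIZE = 5, 456  # 456x456 is EfficientNet-B5's native size

base = tf.keras.applications.EfficientNetB5(
    include_top=False, weights="imagenet",
    input_shape=(IMG_SIZE, IMG_SIZE, 3))
base.trainable = True  # fine-tune the pre-trained backbone, don't freeze it

model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.4),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),  # small LR for fine-tuning
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=10)
```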

    A Novel Adaptive Battery-Aware Algorithm for Data Transmission in IoT-Based Healthcare Applications

    No full text
    The internet of things (IoT) comprises various sensor nodes for monitoring physiological signals, for instance, electrocardiogram (ECG), electroencephalogram (EEG), blood pressure, and temperature, over various emerging technologies such as Wi-Fi, Bluetooth, and cellular networks. The IoT for medical healthcare applications forms the internet of medical things (IoMT), which comprises multiple resource-restricted wearable devices for health monitoring across heterogeneous technological trends. The main challenge for the IoMT is the energy drain and battery charge consumption in the tiny sensor devices. Due to the non-linear behavior of the battery, some stored charge goes unused; introducing idle time allows this charge to be recovered, optimizing the charge and battery lifetime through an efficient recovery mechanism. The contribution of this paper is three-fold. First, a novel adaptive battery-aware algorithm (ABA) is proposed, which utilizes the charge up to its maximum limit and recovers the charge that would otherwise remain unused. The proposed ABA adopts this recovery effect to enhance energy efficiency, battery lifetime, and throughput. Secondly, we propose a novel framework for IoMT-based pervasive healthcare. Thirdly, we test and implement the proposed ABA and framework on a hardware platform for energy efficiency and longer battery lifetime in the IoMT. Furthermore, the transition of states is modeled by a deterministic Mealy finite state machine. The convex optimization tool in MATLAB is adopted, and the proposed ABA is compared with other conventional methods such as battery recovery lifetime enhancement (BRLE). Finally, the proposed ABA enhances the energy efficiency, battery lifetime, and reliability of intelligent pervasive healthcare.
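
    A deterministic Mealy machine produces an output that depends on both the current state and the input. The following Python sketch shows such a machine for a hypothetical battery-aware duty cycle; the states, inputs, and outputs are illustrative assumptions, not the paper's actual model:

```python
# Deterministic Mealy machine: each (state, input) pair maps to exactly
# one (next_state, output) pair. States/inputs/outputs are hypothetical,
# chosen to mirror a transmit/idle/sleep battery-aware duty cycle.
TRANSITIONS = {
    # (state, input)        : (next_state, output)
    ("IDLE", "data_ready")  : ("TRANSMIT", "start_tx"),
    ("IDLE", "tick")        : ("IDLE", "recover_charge"),
    ("TRANSMIT", "done")    : ("IDLE", "stop_tx"),
    ("TRANSMIT", "low_batt"): ("SLEEP", "halt_tx"),
    ("SLEEP", "recovered")  : ("IDLE", "wake"),
}

def run(inputs, state="IDLE"):
    for symbol in inputs:
        state, output = TRANSITIONS[(state, symbol)]
        print(f"-> {state}: {output}")
    return state

run(["data_ready", "low_batt", "recovered", "tick"])
```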

    Detection and Classification of Psychopathic Personality Trait from Social Media Text Using Deep Learning Model

    No full text
    In today's digital era, social media sites such as Facebook, Google, Twitter, and YouTube are used by the majority of people, generating a great deal of textual content. This user-generated textual content discloses important information about people's personalities and can help identify a particular type of person known as a psychopath. The aim of this work is to classify input text into psychopath and non-psychopath traits. Most of the existing work on psychopath detection has been performed in the psychology domain using traditional approaches, such as the SRP-III technique, with limited dataset sizes. This motivates us to build an advanced computational model for psychopath detection in the text analytics domain. In this work, we investigate an advanced deep learning technique, namely an attention-based BiLSTM, for psychopath detection with an increased dataset size for efficient classification of input text into psychopath vs. non-psychopath classes.
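
    A minimal Keras sketch of an attention-based BiLSTM classifier of the kind described (vocabulary size, dimensions, and the simple additive attention head are assumptions, not the paper's exact architecture):

```python
import tensorflow as tf
from tensorflow.keras import layers

VOCAB, MAXLEN, EMB = 20000, 200, 128  # assumed hyper-parameters

inputs = layers.Input(shape=(MAXLEN,))
x = layers.Embedding(VOCAB, EMB)(inputs)
h = layers.Bidirectional(layers.LSTM(64, return_sequences=True))(x)

# Simple additive attention: score each timestep, softmax-normalise
# over the sequence, and pool the BiLSTM states into one weighted
# context vector.
scores = layers.Dense(1, activation="tanh")(h)
weights = layers.Softmax(axis=1)(scores)
context = layers.Lambda(
    lambda t: tf.reduce_sum(t[0] * t[1], axis=1))([h, weights])

outputs = layers.Dense(1, activation="sigmoid")(context)  # psychopath vs. not
model = tf.keras.Model(inputs, outputs)
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
```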

    Optimized Feature Learning for Anti-Inflammatory Peptide Prediction Using Parallel Distributed Computing

    No full text
    With recent advancements in computational biology, high-throughput Next-Generation Sequencing (NGS) has become a de facto standard technology for gene expression studies, including DNAs, RNAs, and proteins; however, it generates several million sequences in a single run. The raw sequencing datasets are increasing exponentially, doubling in size every 18 months, leading to a big-data problem in computational biology. Meanwhile, inflammatory illnesses and boosting immune function have recently attracted much attention, yet accurate recognition of Anti-Inflammatory Peptides (AIPs), which serve as therapeutic agents for inflammatory-related diseases, is time-consuming through biological processes. Likewise, precise classification of these AIPs is challenging for traditional technology and conventional machine learning algorithms. Parallel and distributed computing models and deep neural networks have become major computing platforms for the big data analytics now required in computational biology. This study proposes an efficient high-throughput anti-inflammatory peptide predictor based on a parallel deep neural network model. The model's performance is extensively evaluated with respect to measurement parameters such as accuracy, efficiency, scalability, and speedup in sequential and distributed environments. The encoded sequence data were balanced using the SMOTETomek approach, resulting in high accuracy. The parallel deep neural network demonstrated high speedup and scalability compared to other traditional classification algorithms. The study's outcome could promote a parallel-based model for predicting anti-inflammatory peptides.
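
    The class-balancing step the study reports is available in imbalanced-learn; a minimal sketch follows (the synthetic dataset merely stands in for encoded peptide sequences and is not the study's data):

```python
from collections import Counter
from imblearn.combine import SMOTETomek
from sklearn.datasets import make_classification

# Illustrative imbalanced dataset standing in for encoded peptide
# sequences (minority class = AIP).
X, y = make_classification(n_samples=1000, n_features=20,
                           weights=[0.9, 0.1], random_state=42)
print("before:", Counter(y))

# SMOTETomek oversamples the minority class with SMOTE, then removes
# Tomek links to clean up the class boundary.
X_bal, y_bal = SMOTETomek(random_state=42).fit_resample(X, y)
print("after:", Counter(y_bal))
```

    The balanced X_bal, y_bal would then feed the (parallel) deep neural network classifier described above.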